Adaptive estimation of multivariate piecewise polynomials and bounded variation functions by optimal decision trees

Authors

Abstract

Proposed by Donoho (Ann. Statist. 25 (1997) 1870–1911), Dyadic CART is a nonparametric regression method which computes a globally optimal dyadic decision tree and fits piecewise constant functions in two dimensions. In this article, we define and study Dyadic CART and a closely related estimator, namely the Optimal Regression Tree (ORT), in the context of estimating piecewise smooth functions in general dimensions in the fixed design setup. More precisely, these estimators fit piecewise polynomials of any given degree. Like Dyadic CART in two dimensions, we reason that these estimators can also be computed in time polynomial in the sample size N via dynamic programming. We prove oracle inequalities for the finite sample risk of Dyadic CART and ORT, which imply tight risk bounds for several function classes of interest. First, they imply that the finite sample risk of ORT of order r ≥ 0 is always bounded by Ck log N/N whenever the regression function is piecewise polynomial of degree r on some reasonably regular axis-aligned rectangular partition of the domain with at most k rectangles. Beyond the univariate case, such guarantees are scarcely available in the literature for computationally efficient estimators. Second, our oracle inequalities uncover minimax rate optimality and adaptivity of the Dyadic CART estimator for function spaces of bounded variation. We consider two function spaces of recent interest where multivariate total variation denoising and univariate trend filtering are the state-of-the-art methods. We show that Dyadic CART enjoys certain advantages over these estimators while still maintaining all their known guarantees.
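To illustrate the dynamic-programming idea behind a globally optimal dyadic decision tree, here is a minimal one-dimensional sketch: it fits piecewise constant functions on recursive dyadic splits and exactly minimizes a least-squares criterion penalized by the number of leaves. This is an illustrative simplification, not the authors' ORT implementation; the function name dyadic_cart_1d, the penalty parameter lam, and the restriction to 1D constant fits are assumptions made for the example.

```python
import numpy as np
from functools import lru_cache

def dyadic_cart_1d(y, lam):
    """Hedged 1D sketch of the Dyadic CART idea: over all recursive dyadic
    partitions of [0, N), fit a constant on each cell and minimize
        (sum of squared residuals) + lam * (number of cells),
    computed exactly by dynamic programming over dyadic intervals.
    Assumes len(y) is a power of two."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    assert N & (N - 1) == 0, "length must be a power of two for dyadic splits"

    # Prefix sums let us evaluate the best constant fit on any interval in O(1).
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

    def sse(lo, hi):
        """Sum of squared errors of the best constant (the mean) on y[lo:hi]."""
        n = hi - lo
        tot = s1[hi] - s1[lo]
        return (s2[hi] - s2[lo]) - tot * tot / n

    @lru_cache(maxsize=None)
    def best(lo, hi):
        """Return (cost, splits) for the optimal dyadic tree on y[lo:hi]."""
        leaf_cost = sse(lo, hi) + lam
        if hi - lo == 1:
            return leaf_cost, ()
        mid = (lo + hi) // 2                     # the only allowed (dyadic) split
        lc, lsplit = best(lo, mid)
        rc, rsplit = best(mid, hi)
        if lc + rc < leaf_cost:
            return lc + rc, lsplit + (mid,) + rsplit
        return leaf_cost, ()

    cost, splits = best(0, N)
    # Reconstruct the fitted piecewise constant function from the chosen leaves.
    cuts = (0,) + splits + (N,)
    fit = np.empty(N)
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        fit[lo:hi] = (s1[hi] - s1[lo]) / (hi - lo)
    return fit, cost

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.repeat([0.0, 2.0, -1.0, 1.0], 8)  # piecewise constant signal, N = 32
    y = truth + 0.3 * rng.standard_normal(truth.size)
    fit, cost = dyadic_cart_1d(y, lam=0.5)
    print(np.round(fit, 2))
```

Because every dyadic interval is solved once and reused, the run time is linear in the number of dyadic intervals, which is how the polynomial-time claim arises; the full estimators in the paper extend this recursion to higher dimensions and to piecewise polynomial leaf fits.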


Similar articles

Multivariate piecewise polynomials

This article was supposed to be on `multivariate splines'. An informal survey, taken recently by asking various people in Approximation Theory what they consider to be a `multivariate spline', resulted in the answer that a multivariate spline is a possibly smooth, piecewise polynomial function of several arguments. In particular, the potentially very useful thin-plate spline was thought to belo...


Simple Learning Algorithms for Decision Trees and Multivariate Polynomials

In this paper we develop a new approach for learning decision trees and multivariate polynomials via interpolation of multivariate polynomials. This new approach yields simple learning algorithms for multivariate polynomials and decision trees over finite fields under any constant-bounded product distribution. The output hypothesis is a (single) multivariate polynomial that is an ε-approximation of t...


On the Dimension of Multivariate Piecewise Polynomials

Lower bounds are given on the dimension of piecewise polynomial C^1 and C^2 functions defined on a tessellation of a polyhedral domain into tetrahedra. The analysis technique consists of embedding the space of interest into a larger space with a simpler structure, and then making appropriate adjustments. In the bivariate case, this approach reproduces the well-known lower bounds derived by Schum...


Bounded-degree factors of lacunary multivariate polynomials

In this paper, we present a new method for computing bounded-degree factors of lacunary multivariate polynomials. In particular for polynomials over number fields, we give a new algorithm that takes as input a multivariate polynomial f in lacunary representation and a degree bound d and computes the irreducible factors of degree at most d of f in time polynomial in the lacunary size of f and in ...



Journal

Journal title: Annals of Statistics

Year: 2021

ISSN: 0090-5364, 2168-8966

DOI: https://doi.org/10.1214/20-aos2045